Jackknife Approximations to Bootstrap Estimates

Authors

  • Rudolf Beran
Abstract

Let T_n be an estimate of the form T_n = T(F_n), where F_n is the sample cdf of n iid observations and T is a locally quadratic functional defined on cdf's. Then, the normalized jackknife estimates for bias, skewness, and variance of T_n closely approximate their bootstrap counterparts. Each of these estimates is consistent. Moreover, the jackknife and bootstrap estimates of variance are asymptotically normal and asymptotically minimax. The main result: the first-order Edgeworth expansion estimate for the distribution of n^{1/2}(T_n - T(F)), with F being the actual cdf of each observation and the expansion coefficients being estimated by jackknifing, is asymptotically equivalent to the corresponding bootstrap distribution estimate, up to and including terms of order n^{-1/2}. Both distribution estimates are asymptotically minimax. The jackknife Edgeworth expansion estimate suggests useful corrections for skewness and bias to upper and lower confidence bounds for T(F).

AMS 1980 subject classifications: Primary 62G05; Secondary 62E20.
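As a concrete companion to the abstract, the sketch below compares the jackknife and bootstrap variance estimates for a plug-in statistic T_n = T(F_n). It is purely illustrative: the functional T, the sample, and all names in the code are assumptions, not material from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def T(x):
    """Illustrative locally quadratic plug-in functional: the sample variance."""
    return np.var(x)

x = rng.normal(size=100)  # n iid observations (assumed setup)
n = len(x)

# Jackknife variance estimate: recompute T on each leave-one-out sample.
loo = np.array([T(np.delete(x, i)) for i in range(n)])
var_jack = (n - 1) / n * np.sum((loo - loo.mean()) ** 2)

# Bootstrap variance estimate: recompute T on B resamples drawn from F_n.
B = 2000
boot = np.array([T(rng.choice(x, size=n, replace=True)) for _ in range(B)])
var_boot = boot.var()

# Per Beran (1984), the two estimates should be close, and both are
# consistent for the variance of T_n.
print(var_jack, var_boot)
```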


Related articles

Jackknife Multiplier Bootstrap: Finite Sample Approximations to the U-process Supremum with Applications

This paper is concerned with finite sample approximations to the supremum of a nondegenerate U-process of a general order indexed by a function class. We are primarily interested in situations where the function class as well as the underlying distribution change with the sample size, and the U-process itself is not weakly convergent as a process. Such situations arise in a variety of modern ...
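For intuition, here is a heavily simplified sketch of a jackknife multiplier bootstrap for a single order-2 U-statistic, rather than a full U-process supremum over a function class; the kernel h, the data, and all names are illustrative assumptions, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def h(x, y):
    """Illustrative symmetric order-2 kernel (assumption)."""
    return np.abs(x - y)

x = rng.normal(size=200)
n = len(x)

# U-statistic and jackknife estimates of its Hajek projection components:
# g_hat[i] estimates E[h(X_i, X)] - theta using the other observations.
H = h(x[:, None], x[None, :])
np.fill_diagonal(H, 0.0)
U_n = H.sum() / (n * (n - 1))
g_hat = H.sum(axis=1) / (n - 1) - U_n

# Multiplier bootstrap: Gaussian multipliers applied to g_hat yield draws
# whose conditional law approximates that of sqrt(n) * (U_n - theta).
B = 1000
xi = rng.standard_normal((B, n))
W = xi @ g_hat / np.sqrt(n)

print(np.quantile(np.abs(W), 0.95))  # e.g. a critical value for |sqrt(n)(U_n - theta)|
```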


Confidence Intervals for Random Forests: The Jackknife and the Infinitesimal Jackknife

We study the variability of predictions made by bagged learners and random forests, and show how to estimate standard errors for these methods. Our work builds on variance estimates for bagging proposed by Efron (1992, 2012) that are based on the jackknife and the infinitesimal jackknife (IJ). In practice, bagged predictors are computed using a finite number B of bootstrap replicates, and worki...
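Below is a minimal sketch of the infinitesimal jackknife (IJ) variance estimate for a bagged predictor, with a toy base learner standing in for a tree; the setup and all names are illustrative assumptions, not code from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

x = rng.normal(size=50)  # training sample (illustrative)
n, B = len(x), 5000

# Bagging a toy base learner: each replicate's "prediction" is the
# bootstrap sample mean (a stand-in for a tree's prediction at a test point).
idx = rng.integers(0, n, size=(B, n))                          # bootstrap indices
t_star = x[idx].mean(axis=1)                                   # per-replicate predictions
N = np.stack([np.bincount(row, minlength=n) for row in idx])   # inclusion counts

# Infinitesimal jackknife: sum of squared covariances between the
# inclusion counts N[b, i] and the replicate predictions t_star[b].
cov = ((N - N.mean(axis=0)) * (t_star - t_star.mean())[:, None]).mean(axis=0)
var_ij = np.sum(cov ** 2)

print(var_ij, x.var() / n)  # IJ estimate vs the known variance of the bagged mean
```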



Bias Reduction for k-Sample Functionals

We give analytic methods for nonparametric bias reduction that remove the need for computationally intensive methods like the bootstrap and the jackknife. We call an estimate pth order if its bias has magnitude n_0^{-p} as n_0 → ∞, where n_0 is the sample size (or the minimum sample size if the estimate is a function of more than one sample). Most estimates are only first order and require O(N) calcul...
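As a small worked illustration of analytic, resampling-free bias reduction (an example chosen here, not taken from the paper): the plug-in variance estimate has first-order bias -sigma^2/n, and a closed-form rescaling removes it.

```python
import numpy as np

rng = np.random.default_rng(3)

# The plug-in variance T(F_n) = (1/n) * sum((x - mean)^2) has bias
# -sigma^2/n, so it is first order; multiplying by n/(n-1) removes the
# bias analytically, with no bootstrap or jackknife resampling.
n, sigma2 = 10, 1.0
reps = np.array([np.var(rng.normal(size=n)) for _ in range(100_000)])

print(reps.mean() - sigma2)                # ~ -sigma2/n, the first-order bias
print(reps.mean() * n / (n - 1) - sigma2)  # ~ 0 after the analytic correction
```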




Publication date: 2008